Modeling Dependency Grammar with Restricted Constraints
Authors
Abstract
In this paper, parsing with dependency grammar is modeled as a constraint satisfaction problem. A restricted class of constraints is proposed that is simple enough to be implemented efficiently, yet rich enough to express a wide variety of grammatical well-formedness conditions. We give a number of examples to demonstrate how different kinds of linguistic knowledge can be encoded in this formalism.
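To make the setup concrete, the following sketch (in Python) treats each word's attachment as a constraint variable whose value is a (head, label) pair and filters candidate parses with unary and binary constraints. The three-word sentence, part-of-speech tags, and constraint rules are invented for illustration and are not the paper's formalism; a real grammar would also state conditions such as acyclicity or projectivity as constraints, and would replace the brute-force enumeration below with more efficient constraint evaluation.

from itertools import product

# Toy sketch of dependency parsing as constraint satisfaction: each word i
# gets a variable whose value is (head, label), and a parse is an assignment
# that satisfies all unary and binary constraints.  Lexicon, tags, and rules
# are invented for this example only.

SENTENCE = ["the", "dog", "barks"]
POS = ["DET", "NOUN", "VERB"]   # hypothetical part-of-speech tags
ROOT = -1                        # head index meaning "this word is the root"

def domain(i):
    """All candidate (head, label) values for word i."""
    return [(ROOT, "root")] + [(h, "dep") for h in range(len(SENTENCE)) if h != i]

# --- unary constraints: inspect one word and its assignment -----------------
def det_modifies_following_noun(i, val):
    head, _ = val
    return POS[i] != "DET" or (head != ROOT and head > i and POS[head] == "NOUN")

def only_verbs_may_be_root(i, val):
    head, _ = val
    return head != ROOT or POS[i] == "VERB"

# --- binary constraint: inspect two words' assignments at once --------------
def at_most_one_root(i, vi, j, vj):
    return not (vi[0] == ROOT and vj[0] == ROOT)

UNARY = [det_modifies_following_noun, only_verbs_may_be_root]
BINARY = [at_most_one_root]

def parses():
    # Prune each variable's domain with the unary constraints, then enumerate
    # assignments and keep those that satisfy the binary constraints and
    # contain a root.
    pruned = [[v for v in domain(i) if all(c(i, v) for c in UNARY)]
              for i in range(len(SENTENCE))]
    for assignment in product(*pruned):
        has_root = any(v[0] == ROOT for v in assignment)
        pairwise_ok = all(c(i, assignment[i], j, assignment[j])
                          for c in BINARY
                          for i in range(len(assignment))
                          for j in range(i + 1, len(assignment)))
        if has_root and pairwise_ok:
            yield assignment

if __name__ == "__main__":
    for parse in parses():
        print(parse)   # e.g. ((1, 'dep'), (2, 'dep'), (-1, 'root'))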
Similar Papers
Language modeling by stochastic dependency grammar for Japanese speech recognition
This paper describes a language modeling technique using a kind of stochastic context-free grammar (stochastic dependency grammar, SDG). In this work, two improvements are made to the general CFG-based SCFG model. The first improvement is to use a restricted grammar instead of a general CFG. The dependency grammar used here is a restricted CFG that expresses modification between two words or phras...
Mildly Non-Projective Dependency Grammar
Syntactic representations based on word-to-word dependencies have a long tradition in descriptive linguistics, and receive considerable interest in many computational applications. However, dependency syntax has remained somewhat of an island from a formal point of view, which hampers the exchange of resources and computational methods with other syntactic traditions. In this article, we presen...
Language Modeling Using a Statistical Dependency Grammar Parser
Constraint Dependency Grammar (CDG) uses constraints to determine a sentence's grammatical structure, which is represented as assignments of dependency relations to functional variables associated with each word in the sentence. This paper presents the evaluation of a statistical CDG parser-based language model (LM). This LM, when used to rescore lattices from the Wall Street Journal continuous sp...
Parsing with Soft and Hard Constraints on Dependency Length
In lexicalized phrase-structure or dependency parses, a word’s modifiers tend to fall near it in the string. We show that a crude way to use dependency length as a parsing feature can substantially improve parsing speed and accuracy in English and Chinese, with more mixed results on German. We then show similar improvements by imposing hard bounds on dependency length and (additionally) modelin...
Capitalization Cues Improve Dependency Grammar Induction
We show that orthographic cues can be helpful for unsupervised parsing. In the Penn Treebank, transitions between upper- and lowercase tokens tend to align with the boundaries of base (English) noun phrases. Such signals can be used as partial bracketing constraints to train a grammar inducer: in our experiments, directed dependency accuracy increased by 2.2% (average over 14 languages having cas...